    Biomechanical comparison of the track start and the modified one-handed track start in competitive swimming: an intervention study

    This study compared the conventional track start and a new one-handed track start in elite age-group swimmers to determine whether the new technique had biomechanical implications for dive performance. Five male and seven female GB national qualifiers participated (mean ± SD: age 16.7 ± 1.9 years, stretched stature 1.76 ± 0.8 m, body mass 67.4 ± 7.9 kg) and were assigned to a control group (n = 6) or an intervention group (n = 6) that learned the new one-handed dive technique. All swimmers underwent a 4-week intervention comprising 12 ± 3 thirty-minute training sessions. Video cameras synchronized with an audible signal and a timing suite captured temporal and kinematic data. A portable force plate and a load-cell handrail mounted to a swim starting block collected force data over 3 trials of each technique. A MANCOVA identified Block Time (BT), Flight Time (FT), Peak Horizontal Force of the lower limbs (PHF) and Horizontal Velocity at Take-off (Vx) as covariates. During the 10-m swim trial, significant differences were found in Time to 10 m (TT10m), Total Time (TT), Peak Vertical Force (PVF), Flight Distance (FD), and Horizontal Velocity at Take-off (Vx) (p < .05). Results indicated that the conventional track start was faster over 10 m and may therefore be seen as the superior start after a short intervention. During training, swimmers and coaches should focus on the most statistically significant dive performance variables: peak horizontal force and velocity at take-off, block time, and flight time.
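
    The kind of covariate-adjusted multivariate comparison described above can be sketched with statsmodels. This is only a minimal illustration, not the authors' code: the data file and per-trial layout are hypothetical, and the column names simply reuse the abstract's abbreviations.

```python
# Hedged sketch of a MANCOVA-style analysis: multivariate outcomes
# regressed on group membership plus the covariates named above.
# "dive_trials.csv" and its columns are hypothetical placeholders.
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

df = pd.read_csv("dive_trials.csv")  # one row per trial (assumed)

# Outcomes: time to 10 m, total time, peak vertical force, flight
# distance, take-off velocity. Covariates: block time, flight time,
# peak horizontal force of the lower limbs.
model = MANOVA.from_formula(
    "TT10m + TT + PVF + FD + Vx ~ group + BT + FT + PHF",
    data=df,
)
print(model.mv_test())  # Wilks' lambda etc. for each model term
```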

    "The Fed's Real Reaction Function: Monetary Policy, Inflation, Unemployment, Inequality-and Presidential Politics"

    Using a VAR model of the American economy from 1984 to 2003, we find that, contrary to official claims, the Federal Reserve does not target inflation or react to "inflation signals." Rather, the Fed reacts to the very "real" signal sent by unemployment, in a way that suggests that a baseless fear of full employment is a principal force behind monetary policy. Tests of variations in the workings of a Taylor Rule, using dummy variable regressions on data going back to 1969, suggest that after 1983 the Federal Reserve largely ceased reacting to inflation or high unemployment, but continued to react when unemployment fell "too low." Further, we find that monetary policy (measured by the yield curve) has a significant causal impact on pay inequality, a domain where the Fed refuses responsibility. Finally, we test whether Federal Reserve policy has exhibited a pattern of partisan bias in presidential election years, with results that suggest the presence of such bias, after controlling for the effects of inflation and unemployment.
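
    A dummy-variable Taylor-rule regression of the kind described can be sketched as follows. The data file and the column names (funds_rate, inflation, unemp_gap) are hypothetical placeholders; this illustrates only the regression design, not the authors' specification.

```python
# Hedged sketch: test for a post-1983 break in a Taylor-rule-style
# reaction function by interacting the regressors with a regime dummy.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("fed_quarterly.csv", parse_dates=["date"])  # assumed file
df["post83"] = (df["date"].dt.year > 1983).astype(int)

# The interaction terms let the inflation and unemployment responses
# differ before and after 1983, as in the dummy-variable tests above.
res = smf.ols(
    "funds_rate ~ inflation + unemp_gap"
    " + post83:inflation + post83:unemp_gap + post83",
    data=df,
).fit(cov_type="HAC", cov_kwds={"maxlags": 4})  # serial-correlation robust
print(res.summary())
```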

    Linkletter, Shott, and the Retroactivity Problem in Escobedo

    Prior to the 1964 Supreme Court Term, decisions promulgating new constitutional rules were applied retroactively as a matter of course to final convictions. While dissents occasionally criticized the Court's failure to discuss the retroactive impact of a new constitutional rule, the potential effect upon final convictions of any single rule was not sufficiently acute to justify a departure from the normal grant of retroactivity. But the Court's decision in Mapp v. Ohio, which abruptly overturned Wolf v. Colorado and brought into doubt final state convictions resting upon illegally seized evidence admitted in reliance upon Wolf, caused courts and commentators alike to question the necessity for retroactivity in every case. Subsequently, in Linkletter v. Walker, the Court announced a new policy on the issue of retroactivity and refused to give Mapp retroactive effect. "Once the premise is accepted that we are neither required to apply, nor prohibited from applying, a decision retrospectively," stated the Court, "we must then weigh the merits and demerits in each case by looking to the prior history of the rule in question ... and whether retrospective operation will further or retard its operation."

    Tuning biexciton binding and anti-binding in core/shell quantum dots

    We use a path integral quantum Monte Carlo method to simulate excitons and biexcitons in core/shell nanocrystals with Type-I, Type-II and quasi-Type-II band alignments. Quantum Monte Carlo techniques allow all quantum correlations to be included when determining the thermal ground state, thus producing accurate predictions of biexciton binding. These subtle quantum correlations are found to make the biexciton binding with Type-I carrier localization and strongly anti-binding with Type-II carrier localization, in agreement with experiment for both core/shell nanocrystals and dot-in-rod nanocrystal structures. Simple treatments based on perturbative approaches are shown to miss this important transition in the biexciton binding. Understanding these correlations offers prospects to engineer strong biexciton anti-binding, which is crucial to the design of nanocrystals for single-exciton lasing applications.
    Comment: 10 pages, 11 figures
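
    The binding/anti-binding distinction above comes down to the sign of the biexciton binding energy, conventionally 2*E_X - E_XX. A small sketch of that convention follows; the energies are made-up illustrative numbers, not results from the paper.

```python
# Hedged sketch of the biexciton binding-energy sign convention.
def biexciton_binding(E_X: float, E_XX: float) -> float:
    """Return the biexciton binding energy 2*E_X - E_XX.

    > 0 : binding (as found here for Type-I carrier localization)
    < 0 : anti-binding (as found for Type-II localization)
    """
    return 2.0 * E_X - E_XX

# Illustrative energies in eV only, e.g. thermal ground-state
# estimates from a Monte Carlo run; not the paper's values.
print(biexciton_binding(E_X=-1.90, E_XX=-3.83))  # +0.03: bound
print(biexciton_binding(E_X=-1.90, E_XX=-3.70))  # -0.10: anti-bound
```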

    An Economic Study of the Effect of Android Platform Fragmentation on Security Updates

    Vendors in the Android ecosystem typically customize their devices by modifying Android Open Source Project (AOSP) code, adding in-house developed proprietary software, and pre-installing third-party applications. However, research has documented how various security problems are associated with this customization process. We develop a model of the Android ecosystem utilizing the concepts of game theory and product differentiation to capture the competition involving two vendors customizing the AOSP platform. We show how the vendors are incentivized to differentiate their products from AOSP and from each other, and how prices are shaped through this differentiation process. We also consider two types of consumers: security-conscious consumers who understand and care about security, and naïve consumers who lack the ability to correctly evaluate the security properties of vendor-supplied Android products or simply ignore security. It is evident that vendors shirk on security investments in the latter case. Regulators such as the U.S. Federal Trade Commission have sanctioned Android vendors for underinvestment in security, but the exact effects of these sanctions are difficult to disentangle with empirical data. Here, we model the impact of a regulator-imposed fine that incentivizes vendors to match a minimum security standard. Interestingly, we show how product prices will decrease for the same cost of customization in the presence of a fine, or a higher level of regulator-imposed minimum security.
    Comment: 22nd International Conference on Financial Cryptography and Data Security (FC 2018)
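
    As a loose illustration of the differentiation logic (not the paper's actual game-theoretic model), consider a Hotelling-style toy in which the equilibrium markup scales with how differentiated the two vendors' security choices are, so a minimum standard that lifts the laggard also compresses prices. Every functional form and number below is an assumption.

```python
# Toy Hotelling-style sketch, NOT the paper's model: price equals
# marginal cost plus a markup proportional to product differentiation.
def equilibrium_price(c: float, t: float, s1: float, s2: float) -> float:
    """Identical products (s1 == s2) compete the markup away entirely."""
    return c + t * abs(s1 - s2)

c, t = 1.0, 2.0                           # customization cost, taste weight
print(equilibrium_price(c, t, 0.2, 0.8))  # unregulated: 1.0 + 2*0.6 = 2.2
# A regulator-enforced minimum security level s_min = 0.6 lifts the
# low-security vendor, narrowing differentiation and lowering price:
print(equilibrium_price(c, t, 0.6, 0.8))  # regulated: 1.0 + 2*0.2 = 1.4
```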

    Modelling of Trailing Edge Separation on Arbitrary Two-Dimensional Aerofoils in Incompressible Flow Using an Inviscid Flow Algorithm. G.U. Aero Report 8202

    An algorithm for estimating the lift, moment and pressure distribution on arbitrary two-dimensional aerofoils in incompressible flow is presented. The procedure approximates the physics of the real flow with an inviscid analysis based on a linear vortex panel model. The separated wake geometry is determined iteratively, starting from an initial assumption. A boundary layer analysis is not performed; hence the upper-surface separation point is a necessary input to the algorithm. Lower-surface separation is assumed to occur at the trailing edge. A selection of results and a comparison with experimental data are presented. The scatter between the calculated and experimental values is attributed mainly to the neglect of boundary layer displacement and compressibility effects. A Fortran code listing of the algorithm is given in the Appendix.
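
    A much-simplified relative of the linear vortex panel model is the classical lumped-vortex (thin-aerofoil) method, sketched below for a flat plate. The report's method, with linear vorticity on the true aerofoil surface and an iterated separated wake, is considerably more elaborate; this only illustrates the panel idea.

```python
# Lumped-vortex sketch: point vortices at panel quarter points,
# no-penetration enforced at three-quarter points (this placement
# satisfies the Kutta condition implicitly). Recovers Cl ~ 2*pi*alpha.
import numpy as np

def flat_plate_cl(alpha_rad: float, n_panels: int = 50) -> float:
    c, U = 1.0, 1.0                          # chord, free-stream speed
    dx = c / n_panels
    xv = (np.arange(n_panels) + 0.25) * dx   # vortex locations
    xc = (np.arange(n_panels) + 0.75) * dx   # collocation points
    # Normal velocity at collocation i induced by a unit vortex at j.
    A = -1.0 / (2.0 * np.pi * (xc[:, None] - xv[None, :]))
    rhs = -U * np.sin(alpha_rad) * np.ones(n_panels)  # cancel free stream
    gamma = np.linalg.solve(A, rhs)
    return 2.0 * gamma.sum() / (U * c)       # Kutta-Joukowski lift coeff.

print(flat_plate_cl(np.deg2rad(5.0)))        # ~0.548
print(2.0 * np.pi * np.deg2rad(5.0))         # thin-aerofoil theory value
```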

    Regulation strength and technology creep play key roles in global long-term projections of wild capture fisheries

    Funding: María de Maeztu Unit of Excellence CEX2019-000940-M; European Research Council (http://dx.doi.org/10.13039/501100000781); Horizon 2020 (http://dx.doi.org/10.13039/501100007601), BIGSEA project.
    Many studies have shown that the global fish catch can only be sustained with effective regulation that restrains overfishing. However, the persistence of weak or ineffective regulation in many parts of the world, coupled with changing technologies and additional stressors such as climate change, renders the future of global catches uncertain. Here, we use a spatially resolved, bio-economic size-spectrum model to shed light on the interactive impacts of three globally important drivers over multidecadal timescales: imperfect regulation, technology-driven increases in catchability, and climate change. We implement regulation as the adjustment of fishing towards a target level with some degree of effectiveness and project a range of possible trajectories for global fisheries. We find that if technological progress continues apace, increasingly effective regulation is required to prevent overfishing, akin to a Red Queen race. Climate change reduces the possible upper bound for global catches, but its economic impacts can be offset by strong regulation. Ominously, technological progress under weak regulation masks a progressive erosion of fish biomass by boosting profits and generating a temporary stabilization of global catches. Our study illustrates the large degree to which the long-term outlook of global fisheries can be improved by continually strengthening fisheries regulation, despite the negative impacts of climate change.
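
    The interplay of regulation effectiveness and technology creep can be illustrated with a single-stock toy model (not the paper's spatially resolved size-spectrum model): effort relaxes toward a regulated target with effectiveness eps, while catchability grows by g per year. All parameter values below are arbitrary.

```python
# Hedged toy sketch: logistic biomass dynamics, partially effective
# effort regulation, and a catchability that creeps upward each year.
import numpy as np

def simulate(years=100, r=0.4, K=1.0, q0=0.2, g=0.02,
             E_target=1.0, eps=0.1, E0=1.5, B0=0.8):
    B, E, q = B0, E0, q0
    catches = []
    for _ in range(years):
        catch = q * E * B                  # harvest this year
        B += r * B * (1.0 - B / K) - catch # logistic growth minus catch
        B = max(B, 1e-6)
        E += eps * (E_target - E)          # imperfect adjustment to target
        q *= 1.0 + g                       # technology creep
        catches.append(catch)
    return np.array(catches), B

# Weak vs. strong regulation under the same technology creep:
for eps in (0.02, 0.5):
    catches, B_final = simulate(eps=eps)
    print(f"eps={eps}: mean catch={catches.mean():.3f}, "
          f"final biomass={B_final:.3f}")
```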

    A Review of the Susceptibility of the Scalpay Bridge to Aerodynamic Effects. G.U. Aero Report 9512.

    An independent review is presented of the procedures employed by Crouch, Hogg & Waterman (CHW) in their assessment of the likely aerodynamic effects on the proposed Scalpay Bridge. The review identifies the principal design criteria relevant to the aerodynamic and structural dynamic performance of the Scalpay Bridge as prescribed in BD 49/93. On the basis of data and information supplied by CHW, an assessment is made of the degree to which these criteria are satisfied. It is concluded that there are several sensitive areas in the design analyses undertaken by CHW that should be reconsidered in view of the apparent susceptibility of the bridge to aerodynamic effects. In particular, it is recommended that the response of the bridge to vortex excitation and turbulence, and the narrow stability margin against galloping, be investigated further.
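
    For reference, the quasi-steady (Den Hartog) estimate of the galloping onset wind speed, the kind of stability-margin check at issue in the review, can be sketched as follows. BD 49/93 prescribes its own form and partial factors; this is the textbook version, and the inputs are purely illustrative.

```python
# Hedged sketch of the classical quasi-steady galloping onset estimate.
import math

def galloping_onset_speed(m, zeta, f_n, rho, B, dCl_dalpha, Cd):
    """Onset speed U_g = 4*m*zeta*omega_n / (rho*B*a_G), with
    a_G = -(dCl/dalpha + Cd). Returns inf when the Den Hartog
    criterion (a_G > 0) is not met, i.e. the section cannot gallop."""
    a_G = -(dCl_dalpha + Cd)
    if a_G <= 0.0:
        return math.inf                    # stable in the quasi-steady sense
    omega_n = 2.0 * math.pi * f_n
    return 4.0 * m * zeta * omega_n / (rho * B * a_G)

# Illustrative inputs: mass/length (kg/m), damping ratio, frequency (Hz),
# air density (kg/m^3), deck width (m), lift-curve slope (1/rad), drag coeff.
print(galloping_onset_speed(m=8000.0, zeta=0.005, f_n=0.8,
                            rho=1.225, B=10.0,
                            dCl_dalpha=-4.0, Cd=1.0))  # ~22 m/s
```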

    ADCP measurements from the ICESHELF 94 experiment

    The ICESHELF 94 Experiment was conducted during April of 1994 from an ice camp in the Lincoln Sea at approximately 84 degrees N, 63 degrees W. An Acoustic Doppler Current Profiler (ADCP) was operated at the camp from 8 to 24 April. This report describes the ADCP configuration and presents the raw data recorded by the ADCP. Processing steps involved in computing horizontal velocities in geographic coordinates from the raw data are described, and time series and spectra of the resulting data are presented. Horizontal velocities with a precision of about 1 cm/s were obtained between 27.7 m and 137.0 m depth with 7.8 m resolution. Data were recorded at five-minute intervals but averaged to 1 hr during processing to suppress instrument noise. Spectra show the velocity field to be dominated by variance at the semi-diurnal frequency, with a maximum in energy between 50 and 110 m depth. Maximum amplitudes of 8 to 10 cm/s were seen near 80 m depth. Velocities from an InterOcean S4 current meter deployed at the same site were compared with those from the ADCP. The largest differences were associated with peaks in the semi-diurnal oscillations, with the S4 underspeeding relative to the ADCP.
    Funding was provided by the Office of Naval Research through Contract No. N00014-90-J-1359.
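
    The two processing steps described, block-averaging the 5-minute velocities to hourly values and computing spectra in which the semi-diurnal tide appears as a peak, can be sketched on synthetic data as follows. The amplitudes and record length loosely mirror the report, but nothing here is the actual dataset.

```python
# Hedged sketch: hourly averaging of 5-minute velocity samples, then a
# Welch spectrum that resolves a semi-diurnal (~12.42 h) tidal peak.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)
t = np.arange(0, 16 * 86400, 300.0)          # ~16-day record, 5-min steps
omega_sd = 2.0 * np.pi / (12.42 * 3600.0)    # semi-diurnal frequency
u = 0.09 * np.sin(omega_sd * t) + 0.02 * rng.standard_normal(t.size)

# Block-average 12 consecutive 5-min samples into hourly values
# to suppress instrument noise, as in the report's processing.
u_hourly = u[: t.size // 12 * 12].reshape(-1, 12).mean(axis=1)

f, Puu = welch(u_hourly, fs=1.0 / 3600.0, nperseg=128)
f_peak = f[np.argmax(Puu[1:]) + 1]           # skip the zero-frequency bin
print(f"spectral peak near {1.0 / f_peak / 3600.0:.1f} h period")
```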